The Hilbert Kernel Regression Estimate

Authors

Abstract

Similar Articles

Quantile Regression in Reproducing Kernel Hilbert Spaces

In this paper we consider quantile regression in reproducing kernel Hilbert spaces, which we refer to as kernel quantile regression (KQR). We make three contributions: (1) we propose an efficient algorithm that computes the entire solution path of the KQR, with essentially the same computational cost as fitting one KQR model; (2) we derive a simple formula for the effective dimension of the KQR...


Subspace Regression in Reproducing Kernel Hilbert Space

We focus on three methods for finding a suitable subspace for regression in a reproducing kernel Hilbert space: kernel principal component analysis, kernel partial least squares, and kernel canonical correlation analysis, and we demonstrate how these fit within the more general context of subspace regression. For the kernel partial least squares case a least squares support vector machine style der...


Kernel Partial Least Squares Regression in Reproducing Kernel Hilbert Space

A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is extended by the kernel partial least squares (PLS) regression model. Similar to principal components regression (PCR), PLS is a method based on the projection of input (explanatory) variables to the latent variables (components). However, in contrast to PCR, PLS creates the components by modeling th...


Strong Consistency of Kernel Regression Estimate

In this paper, regression function estimation from independent and identically distributed data is considered. We establish strong pointwise consistency of the well-known Nadaraya-Watson estimator under weaker conditions, which permit kernels with unbounded support and even non-integrable ones, and provide a general approach for constructing strongly consistent kernel estimates of regression...
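The Nadaraya-Watson estimator discussed in this abstract is the kernel-weighted average m̂(x) = Σᵢ K((x − Xᵢ)/h) Yᵢ / Σᵢ K((x − Xᵢ)/h). A minimal sketch, assuming a Gaussian kernel and one-dimensional inputs (the consistency result above covers much more general kernels):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, h):
    """Nadaraya-Watson estimate with a Gaussian kernel:
    m_hat(x) = sum_i K((x - X_i)/h) * Y_i / sum_i K((x - X_i)/h)."""
    # Scaled pairwise differences, shape (n_query, n_train).
    u = (x_query[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * u**2)          # Gaussian kernel weights
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

# Noiseless sine data: with a small bandwidth the estimate
# should sit close to the true regression function.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 500)
y = np.sin(2.0 * np.pi * x)
m_hat = nadaraya_watson(x, y, np.array([0.25, 0.5]), h=0.02)
```

Here `m_hat[0]` should be near sin(π/2) = 1 and `m_hat[1]` near sin(π) = 0, up to the usual smoothing bias of order h².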


Kernel Density Based Linear Regression Estimate

For linear regression models with non-normally distributed errors, the least squares estimate (LSE) will lose some efficiency compared to the maximum likelihood estimate (MLE). In this article, we propose a kernel density based regression estimate (KDRE) that is adaptive to the unknown error distribution. The key idea is to approximate the likelihood function by using a nonparametric kernel den...
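The KDRE idea above can be sketched as follows. This is only an illustration of the general approach the abstract describes, not the article's exact procedure: the leave-one-out Gaussian pseudo-likelihood, the Nelder-Mead optimizer, and the intercept recentering are all assumptions made here for a self-contained example.

```python
import numpy as np
from scipy.optimize import minimize

def kdre(X, y, h=0.3):
    """Kernel-density-based regression estimate (sketch).

    Replace the unknown error likelihood with a leave-one-out Gaussian
    kernel density estimate of the residuals, then maximize that
    pseudo-likelihood over the regression coefficients."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), np.asarray(X)])  # add an intercept column

    def neg_loglik(beta):
        r = y - Xd @ beta                      # residuals at this beta
        d = (r[:, None] - r[None, :]) / h      # pairwise residual gaps
        k = np.exp(-0.5 * d**2) / np.sqrt(2.0 * np.pi)
        np.fill_diagonal(k, 0.0)               # leave-one-out densities
        dens = k.sum(axis=1) / ((n - 1) * h)
        return -np.sum(np.log(dens + 1e-300))

    beta = minimize(neg_loglik,
                    np.linalg.lstsq(Xd, y, rcond=None)[0],  # start from the LSE
                    method="Nelder-Mead").x
    # The pairwise-difference criterion cannot identify the intercept,
    # so recenter it with the mean residual afterwards.
    beta[0] += np.mean(y - Xd @ beta)
    return beta

# Heavy-tailed (Laplace) errors, where the LSE loses efficiency to the MLE.
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 300)
y = 1.0 + 2.0 * x + rng.laplace(0.0, 0.5, 300)
beta_hat = kdre(x[:, None], y)
```

With these data, `beta_hat` should land near the true coefficients (1, 2); the bandwidth `h` plays the same tuning role as in any kernel density estimate.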



Journal

Journal title: Journal of Multivariate Analysis

Year: 1998

ISSN: 0047-259X

DOI: 10.1006/jmva.1997.1725